
    Quorum Sensing in the Context of Food Microbiology

    Food spoilage may be defined as a process that renders a product undesirable or unacceptable for consumption; it is the outcome of the biochemical activity of a microbial community that eventually dominates according to the prevailing ecological determinants. Although only limited information has been reported, this activity has been attributed to quorum sensing (QS). Consequently, the potential role of cell-to-cell communication in food spoilage and food safety should be elucidated more extensively. Such information would be helpful in designing approaches for manipulating these communication systems, thereby reducing or preventing, for instance, spoilage reactions, or even controlling the expression of virulence factors. Because the fundamental features of QS, e.g., the chemistry and definitions of QS compounds, are well documented in the literature, this minireview only alludes to the types and chemistry of QS signaling molecules per se and to the (bioassay-based) methods of their detection and quantification, avoiding extensive documentation. Instead, we attempt to provide insights into (i) the role of QS in food spoilage, (ii) the factors that may quench QS activity in foods, reviewing the potential QS inhibitors that might “mislead” the bacterial coordination of spoilage activities and thus may be used as biopreservatives, and (iii) the future experimental approaches needed to explore the “gray” or “black” areas of QS, increase our understanding of how QS affects microbial behavior in foods, and help answer how QS can be exploited for the benefit of food preservation and food safety.

    Organic acids for control of Salmonella in different feed materials

    Background: Salmonella control in animal feed is important in order to protect animal and public health. Organic acids are one of the control measures used for treatment of Salmonella-contaminated feed or feed ingredients. In the present study, the efficacy of formic acid (FA) and different blends of FA, propionic acid (PA) and sodium formate (SF) was investigated. Four Salmonella strains isolated from feed were assayed for their acid tolerance. The effect of lower temperatures (5°C and 15°C), compared to room temperature, was also investigated in rapeseed meal and soybean meal. Results: The efficacy of acid treatments varied significantly between different feed materials. The strongest reduction was seen in pelleted and compound mash feed (2.5 log10 reduction), followed by rapeseed meal (1 log10 reduction), after 5 days of exposure. However, in soybean meal the acid effects were limited (less than 0.5 log10 reduction) even after several weeks of exposure. In all experiments the survival curves showed a concave shape, with a fast initial death phase followed by reduction at a slower rate during the remaining time of the experiment. No difference in Salmonella reduction was observed between FA and a blend of FA and PA, whereas a commercial blend of FA and SF (Amasil) was slightly more efficacious (0.5-1 log10 reduction) than a blend of FA and PA (Luprocid) in compound mash feed. The Salmonella Infantis strain was found to be the most acid-tolerant strain, followed by S. Putten, S. Senftenberg and S. Typhimurium. The difference in tolerance between the S. Infantis and S. Typhimurium strains was statistically significant (p<0.05). The lethal effect of FA on the S. Typhimurium and S. Infantis strains was lower at 5°C and 15°C than at room temperature. Conclusions: Acid treatment of Salmonella in feed is a matter of reducing the number of viable bacterial cells rather than eliminating the organism. Recommendations on the use of acids for controlling Salmonella in feed should take into account the relative efficacy of acid treatment in different feed materials, the variation in acid tolerance between different Salmonella strains, and the treatment temperature.
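
    The concave survival curves described above, with a fast initial death phase followed by slower inactivation, are commonly described by a biphasic log-linear model (Cerf, 1977). As an illustration only (the study does not state which model, if any, was fitted), a minimal sketch in Python; the parameters f, k1 and k2 are hypothetical, not values from the study:

        import numpy as np

        def biphasic_survival(t, log_n0, f, k1, k2):
            """Biphasic log-linear inactivation: a sensitive subpopulation
            (fraction f) dying at rate k1 and a tolerant subpopulation
            dying at a slower rate k2. Returns log10 N(t)."""
            return log_n0 + np.log10(f * np.exp(-k1 * t) + (1 - f) * np.exp(-k2 * t))

        # Illustrative parameters only, not fitted to the study's data:
        t_days = np.array([0.0, 1.0, 2.0, 5.0])
        print(biphasic_survival(t_days, log_n0=5.0, f=0.99, k1=2.0, k2=0.05))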

    Virulence Gene Sequencing Highlights Similarities and Differences in Sequences in Listeria monocytogenes Serotype 1/2a and 4b Strains of Clinical and Food Origin From 3 Different Geographic Locations

    Peer-reviewed. The Supplementary Material for this article can be found online at: https://www.frontiersin.org/articles/10.3389/fmicb.2018.01103/full#supplementary-material. The prfA-virulence gene cluster (pVGC) is the main pathogenicity island in Listeria monocytogenes, comprising the prfA, plcA, hly, mpl, actA, and plcB genes. In this study, the pVGC of 36 L. monocytogenes isolates was characterized with respect to serotype (1/2a or 4b), geographical origin (Australia, Greece or Ireland) and isolation source (food-associated or clinical). The most conserved genes were prfA and hly, which had the lowest nucleotide diversity (π) among all genes (P < 0.05), with prfA also showing the lowest number of alleles, substitutions and non-synonymous substitutions. Conversely, the most diverse gene was actA, which presented the highest number of alleles (n = 20) and the highest nucleotide diversity. Grouping by serotype gave a significantly lower π value (P < 0.0001) than grouping by isolation source or geographical origin, suggesting that serotype forms a distinct and well-defined unit compared to the other groupings. Among all tested genes, only hly and mpl showed lower nucleotide diversity in serotype 1/2a than in serotype 4b; for the remaining genes, diversity was higher within serotype 1/2a, reflecting a high within-serotype divergence of 1/2a compared to 4b. Geographical divergence was noted with respect to the hly gene, where serotype 4b Irish strains were distinct from Greek and Australian strains. Australian strains showed less diversity in plcB and mpl relative to Irish or Greek strains. Notable differences regarding sequence mutations were identified between food-associated and clinical isolates in the prfA, actA, and plcB sequences. Overall, these results indicate that virulence genes follow different evolutionary pathways, which are affected by a strain's origin and serotype and may influence virulence and/or epidemiological dominance of certain subgroups. This study was supported by the 7th Framework Programme project PROMISE, contract number 265877.
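
    Nucleotide diversity (π), the statistic used above to rank gene conservation, is the average number of pairwise nucleotide differences per site across all pairs of sequences (Nei and Li, 1979). A minimal sketch of the computation, assuming pre-aligned sequences of equal length (toy alleles, not the study's data):

        from itertools import combinations

        def nucleotide_diversity(seqs):
            """pi = mean pairwise differences per site over all sequence pairs.
            Assumes aligned, equal-length sequences; gaps are not handled here."""
            pairs = list(combinations(seqs, 2))
            length = len(seqs[0])
            diffs = sum(sum(a != b for a, b in zip(s1, s2)) for s1, s2 in pairs)
            return diffs / (len(pairs) * length)

        # Three toy alleles of one gene (illustrative only):
        print(nucleotide_diversity(["ATGGCA", "ATGACA", "ATGGCT"]))  # 0.222...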

    Short-term effects of a low glycemic index carob-containing snack on energy intake, satiety, and glycemic response in normal-weight, healthy adults: Results from two randomized trials

    Background/Objectives: The potential positive health effects of carob-containing snacks are largely unknown. Therefore, two studies were conducted to (1) determine the glycemic index (GI) of a carob snack compared to a chocolate cookie containing equal amounts of available carbohydrates and (2) compare the effects of a carob vs. chocolate cookie preload consumed as a snack before a meal on (a) short-term satiety response, measured by subsequent ad libitum meal intake, (b) subjective satiety, as assessed by visual analogue scales (VAS), and (c) postprandial glycemic response. Subjects/Methods: Ten healthy, normal-weight volunteers participated in the GI investigation. Then, 50 healthy, normal-weight subjects consumed the preloads as snacks in a randomized cross-over design with a one-week wash-out period. An ad libitum meal (lunch and dessert) was offered. Capillary blood glucose samples were collected at baseline; 2 h after breakfast, just before preload consumption; 2 h after the preload; 3 h after the preload, just before the meal (lunch and dessert); and 1 h and 2 h after meal consumption. Results: The carob snack was a low-GI food and the chocolate cookie a high-GI food (40 vs. 78 on the glucose scale). Consumption of the carob preload decreased the glycemic response to the following meal and subjects' feelings of hunger, desire to eat, preoccupation with food, and thirst between snack and meal, as assessed by VAS. Subsequently, subjects consumed a smaller amount of food (g) and had a lower total energy intake at the meal. Conclusions: The carob snack led to increased satiety, lower energy intake at the meal and a decreased post-meal glycemic response, possibly due to its low GI value. Identifying foods that promote satiety and decrease glycemic response without increasing overall energy intake may offer advantages for body weight and glycemic control.
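
    The GI values reported above (40 vs. 78) are conventionally calculated as the ratio of the incremental area under the 2-h blood glucose curve (iAUC) for the test food to the iAUC for a glucose reference with equal available carbohydrate, multiplied by 100. A minimal sketch of that calculation with hypothetical glucose profiles (not the trial's data), using a simple trapezoidal rule:

        def iauc(times_min, glucose_mmol_l):
            """Incremental area under the glucose curve above fasting baseline.
            Negative increments are truncated to zero (a simplification of
            the FAO/WHO geometric method)."""
            baseline = glucose_mmol_l[0]
            incr = [max(g - baseline, 0.0) for g in glucose_mmol_l]
            return sum((incr[i - 1] + incr[i]) / 2 * (times_min[i] - times_min[i - 1])
                       for i in range(1, len(times_min)))

        # Hypothetical 2-h capillary glucose profiles (mmol/L); not the trial's data:
        t = [0, 15, 30, 45, 60, 90, 120]
        carob = [5.0, 5.8, 6.4, 6.2, 5.9, 5.4, 5.1]
        glucose_ref = [5.0, 7.2, 8.5, 8.0, 7.0, 6.0, 5.2]

        print(f"GI ~ {100 * iauc(t, carob) / iauc(t, glucose_ref):.0f}")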

    The public health risk posed by Listeria monocytogenes in frozen fruit and vegetables including herbs, blanched during processing

    A multi-country outbreak of Listeria monocytogenes ST6 linked to blanched frozen vegetables (bfV) took place in the EU (2015–2018). Evidence from food-borne outbreaks shows that L. monocytogenes is the most relevant pathogen associated with bfV. The probability of illness per serving of uncooked bfV, for the elderly (65–74 years old) population, is up to 3,600 times greater than for cooked bfV and very likely lower than for any of the evaluated ready-to-eat food categories. The main factors affecting contamination and growth of L. monocytogenes in bfV during processing are the hygiene of the raw materials and process water; the hygienic conditions of the food processing environment (FPE); and the time/temperature (t/T) combinations used for storage and processing (e.g. blanching, cooling). Relevant factors after processing are the intrinsic characteristics of the bfV, the t/T combinations used for thawing and storage, and the subsequent cooking conditions, unless eaten uncooked. Analysis of the possible control options suggests that application of a complete HACCP plan is either not possible or would not further enhance food safety. Instead, specific prerequisite programmes (PRP) and operational PRP activities should be applied, such as cleaning and disinfection of the FPE, water control, t/T control, and product information and consumer awareness. The occurrence of low levels of L. monocytogenes at the end of the production process (e.g. <10 CFU/g) would be compatible with the limit of 100 CFU/g at the moment of consumption if labelling recommendations are strictly followed (i.e. 24 h at 5°C). Under reasonably foreseeable conditions of use (i.e. 48 h at 12°C), L. monocytogenes levels need to be considerably lower (not detected in 25 g). Routine monitoring programmes for L. monocytogenes should be designed following a risk-based approach and regularly revised based on trend analysis, with FPE monitoring being a key activity in the frozen vegetable industry.
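
    The compatibility argument above, from <10 CFU/g at the end of production to the 100 CFU/g limit at consumption, amounts to a growth budget of about 1 log10. A minimal sketch of that check, assuming hypothetical exponential growth rates for L. monocytogenes at each storage temperature (the opinion's underlying exposure model is more elaborate):

        import math

        def log10_growth(mu_max_per_h, hours):
            """log10 increase for unrestricted exponential growth at rate mu_max (1/h)."""
            return mu_max_per_h * hours / math.log(10)

        # Hypothetical specific growth rates (1/h), not the opinion's fitted values:
        scenarios = {"24 h at 5 C": (0.02, 24), "48 h at 12 C": (0.08, 48)}
        budget = math.log10(100) - math.log10(10)   # <10 CFU/g up to the 100 CFU/g limit

        for label, (mu, h) in scenarios.items():
            growth = log10_growth(mu, h)
            print(f"{label}: +{growth:.2f} log10 (budget {budget:.1f}) ->",
                  "within limit" if growth <= budget else "exceeds limit")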

    Microbiological safety of aged meat

    The impact of dry-ageing of beef and wet-ageing of beef, pork and lamb on microbiological hazards and spoilage bacteria was examined, and current practices are described. As 'standard fresh' and wet-aged meat use similar processes, these were differentiated based on duration. In addition to a description of the different stages, data were collated on key parameters (time, temperature, pH and aw) using a literature survey and questionnaires. The microbiological hazards that may be present in all aged meats include Shiga toxin-producing Escherichia coli (STEC), Salmonella spp., Staphylococcus aureus, Listeria monocytogenes, enterotoxigenic Yersinia spp., Campylobacter spp. and Clostridium spp. Moulds, such as Aspergillus spp. and Penicillium spp., may produce mycotoxins when conditions are favourable, but this may be prevented by ensuring a meat surface temperature of −0.5 to 3.0°C, with a relative humidity (RH) of 75–85% and an airflow of 0.2–0.5 m/s for up to 35 days. The main meat spoilage bacteria include Pseudomonas spp., Lactobacillus spp., Enterococcus spp., Weissella spp., Brochothrix spp., Leuconostoc spp., Shewanella spp. and Clostridium spp. Under current practices, the ageing of meat may have an impact on the load of microbiological hazards and spoilage bacteria as compared to standard fresh meat preparation. Ageing under defined and controlled conditions can achieve the same or lower loads of microbiological hazards and spoilage bacteria than the variable log10 increases predicted during standard fresh meat preparation. An approach was used to establish the conditions of time and temperature that would achieve similar or lower levels of L. monocytogenes, Yersinia enterocolitica (pork only) and lactic acid bacteria (representing spoilage bacteria) as compared to standard fresh meat. Finally, additional control activities were identified that would further assure the microbial safety of dry-aged beef, based on recommended best practice and the outputs of the equivalence assessment.
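
    A time-temperature equivalence of the kind described above can be illustrated with a secondary growth model, e.g. the Ratkowsky square-root model, which predicts the maximum specific growth rate from temperature. The sketch below uses placeholder parameters b and Tmin (not EFSA's fitted values) to compare the predicted log10 increase during a dry-ageing scenario with that of a standard fresh preparation:

        import math

        def sqrt_model_mu(temp_c, b=0.023, t_min_c=-1.9):
            """Ratkowsky square-root model: sqrt(mu_max) = b * (T - Tmin).
            b and Tmin are illustrative placeholders, not fitted values."""
            return (b * (temp_c - t_min_c)) ** 2 if temp_c > t_min_c else 0.0

        def log10_increase(temp_c, days):
            mu = sqrt_model_mu(temp_c)              # 1/h
            return mu * days * 24 / math.log(10)

        # Dry-ageing at 0.5 C for 35 d vs. standard fresh preparation at 4 C for 7 d:
        print(f"35 d at 0.5 C: +{log10_increase(0.5, 35):.2f} log10")
        print(f" 7 d at 4 C  : +{log10_increase(4.0, 7):.2f} log10")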

    Evaluation of the application for a new alternative processing method for animal by-products of Category 3 material (ChainCraft B.V.)

    EFSA received an application from the Dutch Competent Authority, under Article 20 of Regulation (EC) No 1069/2009 and Regulation (EU) No 142/2011, for the evaluation of an alternative method for treatment of Category 3 animal by-products (ABP). It consists of the hydrolysis of the material to short carbon chains, resulting in medium-chain fatty acids that may contain up to 1% hydrolysed protein, for use in animal feed. A physical process, with ultrafiltration followed by nanofiltration to remove hazards, is also used. Process efficacy was evaluated based on the ability of the membrane barriers to retain the potential biological hazards present. Small viruses passing the ultrafiltration membrane will be retained at the nanofiltration step, which represents a Critical Control Point (CCP) in the process. This step requires the Applicant to validate and provide certification for the specific use of the nanofiltration membranes. Continuous monitoring and membrane integrity tests should be included as control measures in the HACCP plan. The ultrafiltration and nanofiltration techniques are able to remove particles of the size of viruses, bacteria and parasites from liquids. If used under controlled and appropriate conditions, the processing methods proposed should reduce the risk in the end product to a degree at least equivalent to that achieved with the processing standards laid down in the Regulation for Category 3 material. The possible presence of small bacterial toxins produced during the fermentation steps cannot be avoided by the nanofiltration step, and this hazard should be controlled by a CCP elsewhere in the process. The limitations specified in the current legislation, and any future modifications in relation to the end use of the product, also apply to this alternative process, and no hydrolysed protein of ruminant origin (except from ruminant hides and skins) can be included in feed for farmed animals or for aquaculture.
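
    Membrane steps like the nanofiltration CCP above are typically validated in terms of a log10 reduction value (LRV), comparing hazard concentrations in the feed stream and in the permeate. The opinion does not prescribe a formula, so the following is a minimal sketch under that assumption, with hypothetical challenge-test concentrations:

        import math

        def lrv(feed_conc, permeate_conc):
            """Log10 reduction value of a membrane step:
            LRV = log10(feed concentration / permeate concentration)."""
            return math.log10(feed_conc / permeate_conc)

        # Hypothetical virus challenge test of a nanofiltration membrane:
        print(f"LRV = {lrv(1e6, 1e2):.1f}")  # 10^6 -> 10^2 particles/mL gives 4.0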

    Evaluation of a multi-step catalytic co-processing hydrotreatment for the production of renewable fuels using Category 3 animal fat and used cooking oils

    An alternative method for the production of renewable fuels from rendered animal fats (pretreated using methods 1–5 or method 7 as described in Annex IV of Commission Regulation (EU) No 142/2011) and used cooking oils, derived from Category 3 animal by-products, was assessed. The method is based on a catalytic co-processing hydrotreatment using a middle distillate, followed by a stripping step. The materials must be submitted to a pressure of at least 60 bar and a temperature of at least 270°C for at least 4.7 min. The application focuses on demonstrating the level of reduction of spores of non-pathogenic spore-forming indicator bacterial species (Bacillus subtilis and Desulfotomaculum kuznetsovii), based on a non-systematic review of published data and additional extrapolation analyses. The EFSA BIOHAZ Panel considers that the application and supporting literature contain sufficient evidence that the proposed alternative method can achieve a reduction of at least 5 log10 in the spores of B. subtilis and a 12 log10 reduction in the spores of Clostridium botulinum. The alternative method under evaluation is therefore considered at least equivalent to the processing methods currently approved in Commission Regulation (EU) No 142/2011.
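
    The claimed spore reductions within the 4.7 min hold imply an upper bound on each indicator organism's decimal reduction time (D-value) at the process temperature. A minimal worked check of that arithmetic (illustrative only; the opinion's assessment rests on the reviewed literature, not this calculation):

        # Decimal reduction time D needed for n log10 reductions in time t: D <= t / n
        hold_min = 4.7
        for n_log, organism in [(5, "B. subtilis spores"), (12, "C. botulinum spores")]:
            d_max = hold_min / n_log
            print(f"{organism}: {n_log} log10 in {hold_min} min requires "
                  f"D(270 C) <= {d_max:.2f} min")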